Common Information Components Analysis
Authors
Abstract
Wyner’s common information is a measure that quantifies and assesses the commonality between two random variables. Based on this, we introduce a novel two-step procedure to construct features from data, referred to as Common Information Components Analysis (CICA). The first step can be interpreted as an extraction of Wyner’s common information. The second step is a form of back-projection of the common information onto the original variables, leading to the extracted features. A free parameter γ controls the complexity of the extracted features. We establish that, in the case of Gaussian statistics, CICA precisely reduces to Canonical Correlation Analysis (CCA), where the parameter γ determines the number of CCA components that are extracted. In this sense, a rigorous connection is established between information measures and CCA, and CICA is a strict generalization of the latter. It is shown that CICA has several desirable features, including a natural extension beyond just two data sets.
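As a rough illustration of the Gaussian special case mentioned in the abstract, the sketch below extracts CCA features from jointly Gaussian toy data using scikit-learn. This is a minimal sketch, not the authors' CICA implementation: the toy data, the library choice, and the use of a component count k as a stand-in for the complexity parameter γ are assumptions made for illustration only.

# Minimal sketch of the Gaussian special case (CCA), not the authors' CICA code.
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)

# Jointly Gaussian toy data: X and Y share a low-dimensional latent component.
n, shared_dim = 1000, 2
latent = rng.normal(size=(n, shared_dim))
X = latent @ rng.normal(size=(shared_dim, 5)) + 0.5 * rng.normal(size=(n, 5))
Y = latent @ rng.normal(size=(shared_dim, 4)) + 0.5 * rng.normal(size=(n, 4))

# Assumption for illustration: k plays the role of the complexity parameter,
# i.e. the number of CCA components retained.
k = 2
cca = CCA(n_components=k)
X_feats, Y_feats = cca.fit_transform(X, Y)  # per-view features (canonical scores)

print(X_feats.shape, Y_feats.shape)  # (1000, 2) (1000, 2)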
Similar sources
Efficiency Measurement of Multiple Components Units in Data Envelopment Analysis Using Common Set of Weights
Content Analysis of Health Information Components in School Textbooks
Introduction: Since health is one of God's great bounties bestowed on humans, people should strive to preserve it, especially in today's mechanized life. Therefore, people have to be educated before any disease occurs. This study aimed to analyze the content of secondary school textbooks regarding health education indices such as skin health, oral hygiene, nutrition, sports and physical activity, ...
Functional common principal components models
In this paper, we discuss the extension to the functional setting of the common principal component model that has been widely studied when dealing with multivariate observations. We provide estimators of the common eigenfunctions and study their asymptotic behavior.
Common Functional Principal Components
Functional principal component analysis (FPCA) based on the Karhunen–Loève decomposition has been successfully applied in many applications, mainly for one sample problems. In this paper we consider common functional principal components for two sample problems. Our research is motivated not only by the theoretical challenge of this data situation, but also by the actual question of dynamics of...
Detecting influential observations in principal components and common principal components
Detecting outlying observations is an important step in any analysis, even when robust estimates are used. In particular, the robustified Mahalanobis distance is a natural measure of outlyingness if one focuses on ellipsoidal distributions. However, it is well known that the asymptotic chi-square approximation for the cutoff value of the Mahalanobis distance based on several robust estimates (l...
Journal
Journal title: Entropy
Year: 2021
ISSN: 1099-4300
DOI: https://doi.org/10.3390/e23020151